Results 1 - 20 of 145
1.
ALTEX ; 41(2): 179-201, 2024.
Article in English | MEDLINE | ID: mdl-38629803

ABSTRACT

When The Principles of Humane Experimental Technique was published in 1959, authors William Russell and Rex Burch had a modest goal: to make researchers think about what they were doing in the laboratory - and to do it more humanely. Sixty years later, their groundbreaking book was celebrated for inspiring a revolution in science and launching a new field: the 3Rs of alternatives to animal experimentation. On November 22, 2019, some pioneering and leading scientists and researchers in the field gathered at the Johns Hopkins Bloomberg School of Public Health in Baltimore for the 60 Years of the 3Rs Symposium: Lessons Learned and the Road Ahead. The event was sponsored by the Johns Hopkins Center for Alternatives to Animal Testing (CAAT), the Foundation for Chemistry Research and Initiatives, the Alternative Research & Development Foundation (ARDF), the American Cleaning Institute (ACI), the International Fragrance Association (IFRA), the Institute for In Vitro Sciences (IIVS), John "Jack" R. Fowle III, and the Society of Toxicology (SoT). Fourteen presentations shared the history behind the groundbreaking publication, international efforts to achieve its aims, stumbling blocks to progress, as well as remarkable achievements. The day was a tribute to Russell and Burch, and a testament to what is possible when people from many walks of life - science, government, and industry - work toward a common goal.


William Russell and Rex Burch published their book The Principles of Humane Experimental Technique in 1959. The book encouraged researchers to replace animal experiments where possible, to refine experiments with animals in order to reduce their suffering, and to reduce the number of animals used in experiments to a minimum. Sixty years later, a group of pioneering and leading scientists and researchers in the field gathered to share how the publication came about and how the vision inspired international collaborations and successes on many different levels, including new laws. The paper includes an overview of important milestones in the history of alternatives to animal experimentation.


Subjects
Animal Experimentation, Animal Testing Alternatives, Animals, Humans, Animal Testing Alternatives/methods, Research Design, Industries, Animal Welfare
2.
Regul Toxicol Pharmacol ; 148: 105579, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38309424

ABSTRACT

Chemical safety assessment begins with defining the lowest level of chemical that alters one or more measured endpoints. This critical effect level, along with factors to account for uncertainty, is used to derive limits for human exposure. In the absence of data regarding the specific mechanisms or biological pathways affected, non-specific endpoints such as body weight and non-target organ weight changes are used to set critical effect levels. Specific apical endpoints such as impaired reproductive function or altered neurodevelopment have also been used to set chemical safety limits; however, in test guidelines designed for specific apical effect(s), concurrently measured non-specific endpoints may be equally or more sensitive than specific endpoints. This means that rather than predicting a specific toxicological response, animal data are often used to develop protective critical effect levels, without assuming the same change would be observed in humans. This manuscript is intended to encourage a rethinking of how adverse chemical effects are interpreted: non-specific endpoints from in vivo toxicological studies are often used to derive points of departure for use with safety assessment factors to create recommended exposure levels that are broadly protective but not necessarily target-specific.


Subjects
Toxicity Tests, Animals, Humans, Risk Assessment
3.
Toxicol Sci ; 2024 Feb 04.
Article in English | MEDLINE | ID: mdl-38310358

ABSTRACT

The success and sustainability of U.S. EPA efforts to reduce, refine, and replace in vivo animal testing depend on the ability to translate toxicokinetic and toxicodynamic data from in vitro and in silico new approach methods (NAMs) to human-relevant exposures and health outcomes. Organotypic culture models employing primary human cells enable consideration of human health effects and inter-individual variability, but present significant challenges for test method standardization, transferability, and validation. Increasing confidence in the information provided by these in vitro NAMs requires setting appropriate performance standards and benchmarks, defined by the context of use, to consider human biology and mechanistic relevance without animal data. The human thyroid microtissue assay utilizes primary human thyrocytes to reproduce structural and functional features of the thyroid gland that enable testing for potential thyroid-disrupting chemicals. As a variable-donor assay platform, conventional principles for assay performance standardization need to be balanced with the ability to predict a range of human responses. The objectives of this study were to 1) define the technical parameters for optimal donor procurement, primary thyrocyte qualification, and performance in the human thyroid microtissue assay, and 2) set benchmark ranges for reference chemical responses. Thyrocytes derived from a cohort of 32 demographically diverse euthyroid donors were characterized across a battery of endpoints to evaluate morphological and functional variability. Reference chemical responses were profiled to evaluate the range and chemical-specific variability of donor-dependent effects within the cohort. The data informed minimum acceptance criteria for donor qualification and set benchmark parameters for method transfer proficiency testing and validation of assay performance.

4.
Annu Rev Pharmacol Toxicol ; 64: 191-209, 2024 Jan 23.
Article in English | MEDLINE | ID: mdl-37506331

ABSTRACT

Traditionally, chemical toxicity is determined by in vivo animal studies, which are low throughput, expensive, and sometimes fail to predict compound toxicity in humans. Given the increasing number of chemicals in use and the high rate of drug candidate failure due to toxicity, it is imperative to develop in vitro, high-throughput screening methods to determine toxicity. The Tox21 program, a unique research consortium of federal public health agencies, was established to address and identify toxicity concerns in a high-throughput, concentration-responsive manner using a battery of in vitro assays. In this article, we review the advancements in high-throughput robotic screening methodology and informatics processes that enable the generation of toxicological data, and their impact on the field; further, we discuss the future of assessing environmental toxicity utilizing efficient and scalable methods that better represent the corresponding biological and toxicodynamic processes in humans.


Subjects
High-Throughput Screening Assays, Toxicology, Animals, Humans, High-Throughput Screening Assays/methods, Toxicology/methods
5.
Comput Toxicol ; 28: 1-17, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37990691

ABSTRACT

This work estimates benchmarks for new approach method (NAM) performance in predicting organ-level effects in repeat dose studies of adult animals based on variability in replicate animal studies. Treatment-related effect values from the Toxicity Reference database (v2.1) for weight, gross, or histopathological changes in the adrenal gland, liver, kidney, spleen, stomach, and thyroid were used. Rates of chemical concordance among organ-level findings in replicate studies, defined by repeated chemical only, chemical and species, or chemical and study type, were calculated. Concordance was 39-88%, depending on organ, and was highest within species. Variance in treatment-related effect values, including lowest effect level (LEL) values and benchmark dose (BMD) values when available, was calculated by organ. Multilinear regression modeling, using study descriptors of organ-level effect values as covariates, was used to estimate total variance, mean square error (MSE), and root residual mean square error (RMSE). MSE values, interpreted as estimates of unexplained variance, suggest study descriptors accounted for 52-69% of total variance in organ-level LELs. RMSE ranged from 0.41 to 0.68 log10-mg/kg/day. Differences between organ-level effects from chronic (CHR) and subchronic (SUB) dosing regimens were also quantified. Odds ratios indicated CHR organ effects were unlikely if the SUB study was negative. Mean differences between CHR and SUB organ-level LELs ranged from -0.38 to -0.19 log10-mg/kg/day; the magnitudes of these mean differences were less than the RMSE for replicate studies. Finally, in vitro to in vivo extrapolation (IVIVE) was employed to compare bioactive concentrations from in vitro NAMs for kidney and liver to LELs. The observed mean difference between LELs and mean IVIVE dose predictions approached 0.5 log10-mg/kg/day, but differences by chemical ranged widely. Overall, variability in repeat dose organ-level effects suggests expectations for quantitative accuracy of NAM prediction of LELs should be at least ±1 log10-mg/kg/day, with qualitative accuracy not exceeding 70%.
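The pooled replicate-variance idea behind the RMSE benchmark can be sketched in a few lines. The LEL values below are invented for illustration, and a simple per-chemical-mean residual calculation stands in for the paper's regression-based RMSE estimate:

```python
import math

# Hypothetical replicate liver LELs (log10 mg/kg/day) keyed by chemical;
# each list holds effect levels from replicate repeat-dose studies.
lels = {
    "chem_A": [1.2, 1.5, 0.9],
    "chem_B": [2.0, 2.3],
    "chem_C": [0.4, 0.1, 0.6, 0.3],
}

# Pool squared deviations of each replicate from its chemical-specific
# mean, then take the square root of the pooled mean squared error.
sq_err, n = 0.0, 0
for values in lels.values():
    mean = sum(values) / len(values)
    sq_err += sum((v - mean) ** 2 for v in values)
    n += len(values)
rmse = math.sqrt(sq_err / n)
```

A real analysis would regress LELs on study descriptors first and compute residuals from the fitted model, but the pooled-deviation form above conveys why replicate variability bounds the accuracy any NAM prediction can be expected to achieve.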

6.
Toxicol In Vitro ; 92: 105659, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37557933

ABSTRACT

The H295R test guideline assay evaluates the effect of test substances on synthesis of 17β-estradiol (E2) and testosterone (T). The objective of this study was to leverage commercial immunoassay technology to develop a more efficient H295R assay to measure E2 and T levels in 384-well format. The resulting homogeneous time-resolved fluorescence assay platform (H295R-HTRF) was evaluated against a training set of 36 chemicals derived from the OECD inter-laboratory validation study, EPA guideline 890.1200 aromatase assay, and azole fungicides active in the HT-H295R assay. Quality control performance criteria were met for all conditions except E2 synthesis inhibition, where low basal hormone synthesis was observed. Five proficiency chemicals were active for both the E2 and T endpoints, consistent with guideline classifications. Of the nine OECD core reference chemicals, 9/9 were concordant with outcomes for E2 and 7/9 for T. Likewise, 9/13 and 11/13 OECD supplemental chemicals were concordant with anticipated effects for E2 and T, respectively. Of the 10 azole fungicides screened, 7/10 for E2 and 8/10 for T exhibited concordant outcomes for inhibition. Generally, all active chemicals in the training set demonstrated equivalent or greater potency in the H295R-HTRF assay, supporting the sensitivity of the platform. The adaptation of HTRF technology to the H295R model provides an efficient way to evaluate E2 and T modulators in accordance with guideline specifications.


Subjects
Endocrine Disruptors, Industrial Fungicides, Androgens, Tumor Cell Line, Estrogens, Estradiol, Testosterone, Azoles/pharmacology
7.
Environ Int ; 178: 108082, 2023 08.
Article in English | MEDLINE | ID: mdl-37422975

ABSTRACT

The predominantly animal-centric approach of chemical safety assessment has increasingly come under pressure. Society is questioning the overall performance, sustainability, continued relevance for human health risk assessment, and ethics of this system, demanding a change of paradigm. At the same time, the scientific toolbox used for risk assessment is continuously enriched by the development of "New Approach Methodologies" (NAMs). While this term does not define the age or the state of readiness of the innovation, it covers a wide range of methods, including quantitative structure-activity relationship (QSAR) predictions, high-throughput screening (HTS) bioassays, omics applications, cell cultures, organoids, microphysiological systems (MPS), machine learning models, and artificial intelligence (AI). In addition to promising faster and more efficient toxicity testing, NAMs have the potential to fundamentally transform today's regulatory work by allowing more human-relevant decision-making in terms of both hazard and exposure assessment. Yet, several obstacles hamper a broader application of NAMs in current regulatory risk assessment. Constraints in addressing repeated-dose toxicity, with particular reference to chronic toxicity, and hesitance from relevant stakeholders are major challenges for the implementation of NAMs in a broader context. Moreover, issues regarding predictivity, reproducibility, and quantification need to be addressed, and regulatory and legislative frameworks need to be adapted to NAMs. The conceptual perspective presented here focuses on hazard assessment and is grounded in the main findings and conclusions from a symposium and workshop held in Berlin in November 2021. It intends to provide further insights into how NAMs can be gradually integrated into chemical risk assessment aimed at protection of human health, until eventually the current paradigm is replaced by an animal-free "Next Generation Risk Assessment" (NGRA).


Subjects
Artificial Intelligence, Toxicity Tests, Humans, Reproducibility of Results, Toxicity Tests/methods, Risk Assessment/methods
8.
Environ Int ; 178: 108097, 2023 08.
Article in English | MEDLINE | ID: mdl-37478680

ABSTRACT

Exposure science is evolving from its traditional "after the fact" and "one chemical at a time" approach to forecasting chemical exposures rapidly enough to keep pace with the constantly expanding landscape of chemicals and exposures. In this article, we provide an overview of the approaches, accomplishments, and plans for advancing computational exposure science within the U.S. Environmental Protection Agency's Office of Research and Development (EPA/ORD). First, to characterize the universe of chemicals in commerce and the environment, a carefully curated, web-accessible chemical resource has been created. This DSSTox database unambiguously identifies >1.2 million unique substances reflecting potential environmental and human exposures and includes computationally accessible links to each compound's corresponding data resources. Next, EPA is developing, applying, and evaluating predictive exposure models. These models increasingly rely on data, computational tools like quantitative structure activity relationship (QSAR) models, and machine learning/artificial intelligence to provide timely and efficient prediction of chemical exposure (and associated uncertainty) for thousands of chemicals at a time. Integral to this modeling effort, EPA is developing data resources across the exposure continuum, including application of high-resolution mass spectrometry (HRMS) non-targeted analysis (NTA) methods that provide measurement capability at a scale commensurate with the number of chemicals in commerce. These research efforts are integrated and well-tailored to support population exposure assessment and the prioritization of chemicals by exposure as a critical input to risk management. In addition, the exposure forecasts will allow a wide variety of stakeholders to explore sustainable initiatives like green chemistry to achieve economic, social, and environmental prosperity and protection of future generations.


Subjects
Environmental Pollutants, United States, Humans, Environmental Pollutants/analysis, United States Environmental Protection Agency, Artificial Intelligence, Risk Management, Uncertainty, Environmental Exposure/analysis, Risk Assessment
9.
Front Toxicol ; 5: 1051483, 2023.
Article in English | MEDLINE | ID: mdl-36742129

ABSTRACT

Understanding the metabolic fate of a xenobiotic substance can help inform its potential health risks and allow for the identification of signature metabolites associated with exposure. The need to characterize metabolites of poorly studied or novel substances has shifted exposure studies towards non-targeted analysis (NTA), which often aims to profile many compounds within a sample using high-resolution liquid-chromatography mass-spectrometry (LCMS). Here we evaluate the suitability of suspect screening analysis (SSA) liquid-chromatography mass-spectrometry to inform xenobiotic chemical metabolism. Given a lack of knowledge of true metabolites for most chemicals, predictive tools were used to generate potential metabolites as suspect screening lists to guide the identification of selected xenobiotic substances and their associated metabolites. Thirty-three substances were selected to represent a diverse array of pharmaceutical, agrochemical, and industrial chemicals from the Environmental Protection Agency's ToxCast chemical library. The compounds were incubated in a metabolically active in vitro assay using primary hepatocytes, and the resulting supernatant and lysate fractions were analyzed with high-resolution LCMS. Metabolites were simulated for each compound structure using metabolism prediction software and then combined to serve as the suspect screening list. The exact masses of the predicted metabolites were then used to select LCMS features for fragmentation via tandem mass spectrometry (MS/MS). Of the starting chemicals, 12 were measured in at least one sample in either positive or negative ion mode, and a subset of these were used to develop the analysis workflow. We implemented a screening-level workflow for background subtraction and the incorporation of time-varying kinetics into the identification of likely metabolites. We used haloperidol as a case study to perform an in-depth analysis, which resulted in identifying five known metabolites and five molecular features that represent potential novel metabolites, two of which were assigned discrete structures based on in silico predictions. This workflow was applied to five additional test chemicals, and 15 molecular features were selected as either reported metabolites, predicted metabolites, or potential metabolites without a structural assignment. This study demonstrates that in some, but not all, cases, suspect screening analysis methods provide a means to rapidly identify and characterize metabolites of xenobiotic chemicals.
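The core suspect-screening step, matching observed LCMS feature masses against the exact masses of predicted metabolites within a ppm tolerance, can be sketched as below. The metabolite names, masses, and the 5 ppm window are invented for illustration, not taken from the study:

```python
# Exact neutral monoisotopic masses of in silico-predicted metabolites
# (hypothetical values, used only to illustrate the matching step).
predicted = {
    "M1_hydroxylation": 391.1583,
    "M2_glucuronide": 551.1899,
}
features = [391.1581, 375.1620, 551.1915]  # observed neutral masses

def match_features(feature_mass, targets, tol_ppm=5.0):
    """Return names of predicted metabolites within tol_ppm of the feature."""
    return [name for name, exact in targets.items()
            if abs(feature_mass - exact) / exact * 1e6 <= tol_ppm]

hits = {m: match_features(m, predicted) for m in features}
```

Matched features would then be queued for MS/MS fragmentation to confirm or refute the candidate structure, since an exact-mass match alone cannot distinguish isomers.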

10.
Risk Anal ; 43(3): 498-515, 2023 03.
Article in English | MEDLINE | ID: mdl-35460101

ABSTRACT

A number of investigators have explored the use of value of information (VOI) analysis to evaluate alternative information collection procedures in diverse decision-making contexts. This paper presents an analytic framework for determining the value of toxicity information used in risk-based decision making. The framework is specifically designed to explore the trade-offs between cost, timeliness, and uncertainty reduction associated with different toxicity-testing methodologies. The use of the proposed framework is demonstrated by two illustrative applications which, although based on simplified assumptions, show the insights that can be obtained through the use of VOI analysis. Specifically, these results suggest that timeliness of information collection has a significant impact on estimates of the VOI of chemical toxicity tests, even in the presence of smaller reductions in uncertainty. The framework introduces the concept of the expected value of delayed sample information, as an extension of the usual expected value of sample information, to accommodate the reductions in value resulting from delayed decision making. Our analysis further suggests that lower-cost and higher-throughput testing may be beneficial in terms of public health benefits by increasing the number of substances that can be evaluated within a given budget. When the relative value is expressed in terms of return-on-investment per testing strategy, the differences can be substantial.
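One way to see why delay erodes the value of a toxicity test is a toy two-action decision: regulate a chemical (fixed cost) or not (expected harm if it is toxic). All numbers below are invented, and perfect information stands in for the paper's more general sample-information case:

```python
# Hypothetical illustration values, not from the paper.
p_toxic = 0.3
cost_regulate = 2.0   # annualized regulatory cost ($M/yr)
cost_harm = 10.0      # annualized harm if toxic and unregulated ($M/yr)
horizon = 10.0        # years over which losses accrue

# Acting on the prior: choose the action with the lower expected loss.
loss_prior = min(cost_regulate, p_toxic * cost_harm) * horizon

# Perfect, immediate information: regulate only the truly toxic chemical.
loss_informed = p_toxic * cost_regulate * horizon
evpi = loss_prior - loss_informed  # expected value of perfect information

def ev_delayed_info(delay_years):
    """EVPI eroded by the fraction of the horizon spent waiting for the
    test result, during which only the prior-based action is available."""
    return evpi * (1.0 - delay_years / horizon)
```

Under these numbers a test that takes three years to deliver its answer is worth only 70% of an instantaneous one, which is the qualitative point the abstract makes about timeliness dominating modest gains in uncertainty reduction.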


Subjects
Decision Support Techniques, Uncertainty, Cost-Benefit Analysis
12.
Toxicol Sci ; 187(1): 112-126, 2022 04 26.
Article in English | MEDLINE | ID: mdl-35172002

ABSTRACT

The U.S. EPA continues to utilize high-throughput screening data to evaluate potential biological effects of endocrine active substances without the use of animal testing. Determining the scope and need for in vitro metabolism in high-throughput assays requires the generation of larger data sets that assess the impact of xenobiotic transformations on toxicity-related endpoints. The objective of the current study was to screen a set of 768 ToxCast chemicals in the VM7Luc estrogen receptor transactivation assay (ERTA) using the Alginate Immobilization of Metabolic Enzymes (AIME) hepatic metabolism method. Chemicals were screened with or without metabolism to identify estrogenic effects and metabolism-dependent changes in bioactivity. Based on estrogenic hit calls, 85 chemicals were active in both assay modes, 16 chemicals were only active without metabolism, and 27 chemicals were only active with metabolism. Using a novel metabolism curve-shift method that evaluates the shift in concentration-response curves, 29 of these estrogenic chemicals were identified as bioactivated and 59 as bioinactivated. Human biotransformation routes and associated metabolites were predicted in silico across the chemicals to mechanistically characterize possible transformation-related ERTA effects. Overall, the study profiled novel chemicals associated with metabolism-dependent changes in ERTA bioactivity and suggested routes of biotransformation and putative metabolites responsible for the observed estrogenic effects. The data demonstrate a range of metabolism-dependent effects across a diverse chemical library and highlight the need to evaluate the role of intrinsic xenobiotic metabolism for endocrine and other toxicity-related health effects.
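The curve-shift idea can be reduced to comparing potency with and without metabolism: a sufficiently large leftward shift (lower AC50 with metabolism) suggests bioactivation, a rightward shift bioinactivation. This is a simplified sketch, not the paper's method; the 0.3 log10 (roughly 2-fold) threshold and the AC50 values are assumed illustration numbers:

```python
import math

def curve_shift_call(ac50_without_uM, ac50_with_uM, min_shift=0.3):
    """Classify a chemical by the log10 shift in AC50 when hepatic
    metabolism is added to the assay (positive shift = more potent)."""
    shift = math.log10(ac50_without_uM) - math.log10(ac50_with_uM)
    if shift > min_shift:
        return "bioactivated"
    if shift < -min_shift:
        return "bioinactivated"
    return "no metabolism-dependent change"

call = curve_shift_call(10.0, 1.0)  # 10-fold more potent with metabolism
```

A production implementation would compare full fitted concentration-response curves (and their confidence bounds) rather than point AC50s, but the classification logic is the same.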


Subjects
Endocrine Disruptors, Animals, Endocrine Disruptors/toxicity, Estrogens/toxicity, Estrone, Estrogen Receptors/genetics, Estrogen Receptors/metabolism, Transcriptional Activation, Xenobiotics/toxicity
13.
Comput Toxicol ; 24, 2022 Nov 05.
Article in English | MEDLINE | ID: mdl-36969381

ABSTRACT

Per- and Polyfluoroalkyl substances (PFAS) are a class of synthetic chemicals that are in widespread use and present concerns for persistence, bioaccumulation and toxicity. Whilst a handful of PFAS have been characterised for their hazard profiles, the vast majority of PFAS have not been studied. The US Environmental Protection Agency (EPA) undertook a research project to screen ~150 PFAS through an array of different in vitro high throughput toxicity and toxicokinetic tests in order to inform chemical category and read-across approaches. A previous publication described the rationale behind the selection of an initial set of 75 PFAS, whereas herein, we describe how various category approaches were applied and extended to inform the selection of a second set of 75 PFAS from our library of approximately 430 commercially procured PFAS. In particular, we focus on the challenges in grouping PFAS for prospective analysis and how we have sought to develop and apply objective structure-based categories to profile the testing library and other PFAS inventories. We additionally illustrate how these categories can be enriched with other information to facilitate read-across inferences once experimental data become available. The availability of flexible, objective, reproducible and chemically intuitive categories to explore PFAS constitutes an important step forward in prioritising PFAS for further testing and assessment.

14.
Risk Anal ; 42(4): 707-729, 2022 04.
Article in English | MEDLINE | ID: mdl-34490933

ABSTRACT

Regulatory agencies are required to evaluate the impacts of thousands of chemicals. Toxicological tests currently used in such evaluations are time-consuming and resource intensive; however, advances in toxicology and related fields are providing new testing methodologies that reduce the cost and time required for testing. The selection of a preferred methodology is challenging because the new methodologies vary in duration and cost, and the data they generate vary in the level of uncertainty. This article presents a framework for performing cost-effectiveness analyses (CEAs) of toxicity tests that account for cost, duration, and uncertainty. This is achieved by using an output metric, the cost per correct regulatory decision, that reflects all three elements. The framework is demonstrated in two example CEAs, one for a simple decision of risk acceptability and a second, more complex decision involving the selection of regulatory actions. Each example CEA evaluates five hypothetical toxicity-testing methodologies that differ with respect to cost, time, and uncertainty. The results of the examples indicate that either a fivefold reduction in cost or duration can be a larger driver of the selection of an optimal toxicity-testing methodology than a fivefold reduction in uncertainty. Uncertainty becomes of similar importance to cost and duration when decisionmakers are required to make more complex decisions that require the determination of small differences in risk predictions. The framework presented in this article may provide a useful basis for the identification of cost-effective methods for toxicity testing of large numbers of chemicals.
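A cost-per-correct-decision metric can be sketched by folding duration into cost and normalizing by decision accuracy. Everything below (the three methods, their costs, accuracies, and the delay penalty) is invented for illustration and is not the paper's calibration:

```python
# Hypothetical testing methods: per-chemical cost ($K), duration (years),
# and probability the result leads to a correct regulatory decision.
methods = {
    "in_vivo":   {"cost": 500.0, "years": 3.0, "p_correct": 0.85},
    "in_vitro":  {"cost": 100.0, "years": 0.5, "p_correct": 0.75},
    "in_silico": {"cost": 5.0,   "years": 0.1, "p_correct": 0.65},
}

def cost_per_correct_decision(m, delay_penalty_per_year=50.0):
    """Fold duration into cost via an assumed per-year delay penalty,
    then normalize by the chance of deciding correctly."""
    effective_cost = m["cost"] + delay_penalty_per_year * m["years"]
    return effective_cost / m["p_correct"]

ranked = sorted(methods, key=lambda k: cost_per_correct_decision(methods[k]))
```

With these numbers the cheap, fast method wins despite its lower accuracy, which mirrors the abstract's finding that fivefold changes in cost or duration can outweigh a fivefold change in uncertainty.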


Subjects
Toxicity Tests, Cost-Benefit Analysis, Uncertainty
15.
ALTEX ; 39(1): 123-139, 2022.
Article in English | MEDLINE | ID: mdl-34818430

ABSTRACT

Internationally, there are thousands of existing and newly introduced chemicals in commerce, highlighting the ongoing importance of innovative approaches to identify emerging chemicals of concern. For many chemicals, there is a paucity of hazard and exposure data. Thus, there is a crucial need for efficient and robust approaches to address data gaps and support risk-based prioritization. Several studies have demonstrated the utility of in vitro bioactivity data from the ToxCast program in deriving points of departure (PODs). ToxCast contains data for nearly 1,400 endpoints per chemical, and the bioactivity concentrations, indicative of potential adverse outcomes, can be converted to human-equivalent PODs using high-throughput toxicokinetics (HTTK) modeling. However, data gaps need to be addressed for broader application: the limited chemical space of HTTK and quantitative high-throughput screening data. Here we explore the applicability of in silico models to address these data needs. Specifically, we used ADMET Predictor for HTTK predictions and a generalized read-across approach to predict ToxCast bioactivity potency. We applied these models to profile 5,801 chemicals on Canada's Domestic Substances List (DSL). To evaluate the approach's performance, bioactivity PODs were compared with in vivo results from the EPA Toxicity Values database for 1,042 DSL chemicals. Comparisons demonstrated that the bioactivity PODs, based on ToxCast data or read-across, were conservative for 95% of the chemicals. Comparing bioactivity PODs to human exposure estimates supports the identification of chemicals of potential interest for further work. The bioactivity workflow shows promise as a powerful screening tool to support effective triaging of chemical inventories.
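The HTTK conversion from an in vitro bioactive concentration to a human-equivalent POD is, in its simplest steady-state form, a division by the plasma concentration produced per unit dose. This is a reverse-dosimetry sketch with invented numbers; real workflows derive the steady-state Css from HTTK models parameterized with measured or predicted clearance and plasma binding:

```python
# Reverse dosimetry: administered-equivalent dose for an in vitro AC50.
# css_uM_per_mg_kg_day is the steady-state plasma concentration (uM)
# produced by a constant 1 mg/kg/day intake (hypothetical value here).
def bioactivity_pod(ac50_uM, css_uM_per_mg_kg_day):
    """Human-equivalent POD (mg/kg/day) for the given bioactive conc."""
    return ac50_uM / css_uM_per_mg_kg_day

# Hypothetical chemical: lowest ToxCast-style AC50 of 3 uM and a Css of
# 1.5 uM per mg/kg/day give a POD of 2 mg/kg/day.
pod = bioactivity_pod(3.0, 1.5)
```

Screening applications typically use an upper-percentile Css across a simulated population so the resulting POD is conservative, consistent with the 95% conservatism reported in the abstract.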


Subjects
High-Throughput Screening Assays, Factual Databases, Humans, Risk Assessment, Toxicokinetics
16.
Regul Toxicol Pharmacol ; 125: 105020, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34333066

ABSTRACT

Omics methodologies are widely used in toxicological research to understand modes and mechanisms of toxicity. Increasingly, these methodologies are being applied to questions of regulatory interest such as molecular point-of-departure derivation and chemical grouping/read-across. Despite its value, widespread regulatory acceptance of omics data has not yet occurred. Barriers to the routine application of omics data in regulatory decision making have been: 1) lack of transparency for data processing methods used to convert raw data into an interpretable list of observations; and 2) lack of standardization in reporting to ensure that omics data, associated metadata and the methodologies used to generate results are available for review by stakeholders, including regulators. Thus, in 2017, the Organisation for Economic Co-operation and Development (OECD) Extended Advisory Group on Molecular Screening and Toxicogenomics (EAGMST) launched a project to develop guidance for the reporting of omics data aimed at fostering further regulatory use. Here, we report on the ongoing development of the first formal reporting framework describing the processing and analysis of both transcriptomic and metabolomic data for regulatory toxicology. We introduce the modular structure, content, harmonization and strategy for trialling this reporting framework prior to its publication by the OECD.


Subjects
Metabolomics/standards, Organisation for Economic Co-operation and Development/standards, Toxicogenetics/standards, Toxicology/standards, Transcriptome/physiology, Documentation/standards, Humans
17.
Toxicol Sci ; 181(1): 68-89, 2021 04 27.
Article in English | MEDLINE | ID: mdl-33538836

ABSTRACT

New approach methodologies (NAMs) that efficiently provide information about chemical hazard without using whole animals are needed to accelerate the pace of chemical risk assessments. Technological advancements in gene expression assays have made in vitro high-throughput transcriptomics (HTTr) a feasible option for NAMs-based hazard characterization of environmental chemicals. In this study, we evaluated the Templated Oligo with Sequencing Readout (TempO-Seq) assay for HTTr concentration-response screening of a small set of chemicals in the human-derived MCF7 cell model. Our experimental design included a variety of reference samples and reference chemical treatments in order to objectively evaluate TempO-Seq assay performance. To facilitate analysis of these data, we developed a robust and scalable bioinformatics pipeline using open-source tools. We also developed a novel gene expression signature-based concentration-response modeling approach and compared the results to a previously implemented workflow for concentration-response analysis of transcriptomics data using BMDExpress. Analysis of reference samples and reference chemical treatments demonstrated highly reproducible differential gene expression signatures. In addition, we found that aggregating signals from individual genes into gene signatures prior to concentration-response modeling yielded in vitro transcriptional biological pathway altering concentrations (BPACs) that were closely aligned with those of previous ToxCast high-throughput screening assays. The identified signatures were often associated with the known molecular target of the chemicals in our test set and represented the most sensitive components of the overall transcriptional response. This work has resulted in a novel and scalable in vitro HTTr workflow that is suitable for high-throughput hazard evaluation of environmental chemicals.
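The signature-aggregation step can be sketched as averaging per-gene responses into one score per concentration and then interpolating where that score crosses a cutoff. The genes, fold-changes, concentrations, and the 0.5 cutoff below are all invented; the paper's actual approach fits concentration-response models rather than interpolating linearly:

```python
import math

concs = [0.1, 1.0, 10.0, 100.0]  # test concentrations (uM)
gene_l2fc = {                     # per-gene log2 fold-changes (invented)
    "G1": [0.0, 0.2, 0.8, 1.6],
    "G2": [0.1, 0.3, 0.9, 1.4],
    "G3": [-0.1, 0.1, 0.7, 1.5],
}

# Aggregate per-gene responses into one signature score per concentration.
scores = [sum(fc[i] for fc in gene_l2fc.values()) / len(gene_l2fc)
          for i in range(len(concs))]

def bpac(concs, scores, cutoff=0.5):
    """Concentration where the signature score first crosses the cutoff,
    log-linearly interpolated; None if it never crosses."""
    for i in range(len(concs) - 1):
        s0, s1 = scores[i], scores[i + 1]
        if s0 < cutoff <= s1:
            t = (cutoff - s0) / (s1 - s0)
            lc = math.log10(concs[i]) + t * (
                math.log10(concs[i + 1]) - math.log10(concs[i]))
            return 10 ** lc
    return None
```

Averaging before modeling is the key design choice: it damps single-gene noise, which is why signature-level BPACs tend to be more reproducible than gene-level ones.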


Subjects
High-Throughput Screening Assays, Transcriptome, Animals, Bioassay, Computational Biology, Humans, Risk Assessment
18.
Chem Res Toxicol ; 34(2): 189-216, 2021 02 15.
Article in English | MEDLINE | ID: mdl-33140634

ABSTRACT

Since 2009, the Tox21 project has screened ∼8500 chemicals in more than 70 high-throughput assays, generating upward of 100 million data points, with all data publicly available through partner websites at the United States Environmental Protection Agency (EPA), National Center for Advancing Translational Sciences (NCATS), and National Toxicology Program (NTP). Underpinning this public effort is the largest compound library ever constructed specifically for improving understanding of the chemical basis of toxicity across research and regulatory domains. Each Tox21 federal partner brought specialized resources and capabilities to the partnership, including three approximately equal-sized compound libraries. All Tox21 data generated to date have resulted from a confluence of ideas, technologies, and expertise used to design, screen, and analyze the Tox21 10K library. The different programmatic objectives of the partners led to three distinct, overlapping compound libraries that, when combined, not only covered a diversity of chemical structures, use-categories, and properties but also incorporated many types of compound replicates. The history of development of the Tox21 "10K" chemical library and data workflows implemented to ensure quality chemical annotations and allow for various reproducibility assessments are described. Cheminformatics profiling demonstrates how the three partner libraries complement one another to expand the reach of each individual library, as reflected in coverage of regulatory lists, predicted toxicity end points, and physicochemical properties. ToxPrint chemotypes (CTs) and enrichment approaches further demonstrate how the combined partner libraries amplify structure-activity patterns that would otherwise not be detected. 
Finally, CT enrichments are used to probe global patterns of activity in combined ToxCast and Tox21 activity data sets relative to test-set size and chemical versus biological end point diversity, illustrating the power of CT approaches to discern patterns in chemical-activity data sets. These results support a central premise of the Tox21 program: A collaborative merging of programmatically distinct compound libraries would yield greater rewards than could be achieved separately.
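The chemotype (CT) enrichment analysis described in this abstract reduces, per chemotype, to a 2×2 contingency table of active/inactive versus has-CT/lacks-CT chemicals. A minimal sketch of an odds-ratio enrichment score, using hypothetical counts rather than actual Tox21 data:

```python
from math import inf

def chemotype_enrichment(actives_with_ct, actives_without_ct,
                         inactives_with_ct, inactives_without_ct):
    """Odds ratio for a chemotype (CT) being over-represented among actives.

    2x2 contingency table:
                   has CT   lacks CT
        active       a         b
        inactive     c         d
    An odds ratio > 1 suggests the CT is enriched among active chemicals.
    """
    a, b = actives_with_ct, actives_without_ct
    c, d = inactives_with_ct, inactives_without_ct
    if b * c == 0:  # avoid division by zero for degenerate tables
        return inf
    return (a * d) / (b * c)

# Hypothetical counts: 40 of 200 actives carry the chemotype,
# versus 60 of 1800 inactives.
score = chemotype_enrichment(40, 160, 60, 1740)
```

In practice a significance test (e.g. Fisher's exact test) and multiple-testing correction would accompany the raw odds ratio; this sketch shows only the enrichment score itself.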


Subjects
Small Molecule Libraries/toxicity, Toxicity Tests, High-Throughput Screening Assays, Humans, United States, United States Environmental Protection Agency
19.
Toxicol Sci ; 178(2): 281-301, 2020 12 01.
Article in English | MEDLINE | ID: mdl-32991717

ABSTRACT

The U.S. EPA Endocrine Disruptor Screening Program utilizes data across the ToxCast/Tox21 high-throughput screening (HTS) programs to evaluate the biological effects of potential endocrine active substances. A potential limitation to the use of in vitro assay data in regulatory decision-making is the lack of coverage for xenobiotic metabolic processes. Both hepatic- and peripheral-tissue metabolism can yield metabolites that exhibit greater activity than the parent compound (bioactivation) or are inactive (bioinactivation) for a given biological target. Interpretation of biological effect data for both putative endocrine active substances, as well as other chemicals, screened in HTS assays may benefit from the addition of xenobiotic metabolic capabilities to decrease the uncertainty in predicting potential hazards to human health. The objective of this study was to develop an approach to retrofit existing HTS assays with hepatic metabolism. The Alginate Immobilization of Metabolic Enzymes (AIME) platform encapsulates hepatic S9 fractions in alginate microspheres attached to 96-well peg lids. Functional characterization across a panel of reference substrates for phase I cytochrome P450 enzymes revealed substrate depletion with expected metabolite accumulation. Performance of the AIME method in the VM7Luc estrogen receptor transactivation assay was evaluated across 15 reference chemicals and 48 test chemicals that yield metabolites previously identified as estrogen receptor active or inactive. The results demonstrate the utility of applying the AIME method for identification of false-positive and false-negative target assay effects, reprioritization of hazard based on metabolism-dependent bioactivity, and enhanced in vivo concordance with the rodent uterotrophic bioassay. Integration of the AIME metabolism method may prove useful for future biochemical and cell-based HTS applications.
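The "enhanced in vivo concordance" claimed for the AIME method amounts to comparing binary in vitro activity calls against reference in vivo calls. A minimal sketch of that comparison, with hypothetical calls for eight reference chemicals (not the study's actual data):

```python
def concordance(in_vitro_calls, in_vivo_calls):
    """Sensitivity, specificity, and overall concordance of binary in vitro
    activity calls against an in vivo reference classification (e.g. the
    rodent uterotrophic bioassay). Calls are True (active) / False (inactive).
    """
    pairs = list(zip(in_vitro_calls, in_vivo_calls))
    tp = sum(1 for v, r in pairs if v and r)          # true positives
    tn = sum(1 for v, r in pairs if not v and not r)  # true negatives
    fp = sum(1 for v, r in pairs if v and not r)      # false positives
    fn = sum(1 for v, r in pairs if not v and r)      # false negatives
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    overall = (tp + tn) / len(pairs)
    return sensitivity, specificity, overall

# Hypothetical uterotrophic reference calls and assay calls
# with and without the metabolism retrofit:
in_vivo   = [True, True, True, True, False, False, False, False]
no_metab  = [True, False, True, False, False, True, False, False]
with_s9   = [True, True, True, True, False, False, False, False]
```

Under these made-up calls the metabolism-retrofitted assay recovers two false negatives (metabolite bioactivation) and removes one false positive (bioinactivation), which is the pattern of improvement the abstract describes.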


Subjects
Alginates/chemistry, Endocrine Disruptors, Enzymes, Immobilized/chemistry, Liver/enzymology, Estrogen Receptors, Animals, Bioassay, High-Throughput Screening Assays, Estrogen Receptors/metabolism, Rodents, Toxicity Tests, Transcriptional Activation